Open Problem: Tensor Decompositions: Algorithms up to the Uniqueness Threshold?

Authors

  • Aditya Bhaskara
  • Moses Charikar
  • Ankur Moitra
  • Aravindan Vijayaraghavan
Abstract

Factor analysis is a basic tool in statistics and machine learning, where the goal is to take many variables and explain them away using fewer unobserved variables, called factors. It was introduced in a pioneering study by the psychologist Charles Spearman, who used it to test his theory that there are fundamentally two types of intelligence – verbal and mathematical. This study has had a deep influence on modern psychology to this day. However, there is a serious mathematical limitation to this approach, which we describe next. In its most basic form, we are given a matrix M = ∑_{i=1}^{R} a_i ⊗ b_i. Our goal is to recover the factors {a_i} and {b_i}. However, this decomposition is only unique if we add further assumptions, such as requiring the factors {a_i} and {b_i} to be orthonormal. Otherwise we could apply an R×R rotation to the factors {a_i} and its transpose to {b_i} and recover another valid decomposition. This is often called the rotation problem and has been a central stumbling block for factor analysis since Spearman’s work. To summarize: even if there is a factorization M = ∑_{i=1}^{R} a_i ⊗ b_i that has a meaningful interpretation, there is no guarantee that factor analysis finds it!
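To make the rotation problem concrete, the following sketch (a hypothetical NumPy example with assumed dimensions, not code from the paper) builds M = ∑_{i=1}^{R} a_i ⊗ b_i from one set of factors, applies an arbitrary R×R rotation to the {a_i} and its transpose to the {b_i}, and verifies that the rotated factors reproduce exactly the same matrix M:

```python
import numpy as np

# Hypothetical sizes: any n, m, R with R <= min(n, m) illustrate the point.
n, m, R = 6, 5, 3
rng = np.random.default_rng(0)

A = rng.standard_normal((n, R))  # columns are the factors a_1, ..., a_R
B = rng.standard_normal((m, R))  # columns are the factors b_1, ..., b_R

# M = sum_{i=1}^R a_i (outer product) b_i, i.e. M = A B^T.
M = A @ B.T

# An arbitrary R x R rotation Q applied to the a_i, with its transpose absorbed
# on the b_i side, yields a different set of factors with exactly the same M.
Q, _ = np.linalg.qr(rng.standard_normal((R, R)))  # random orthogonal matrix
A_rot = A @ Q
B_rot = B @ Q  # (A Q)(B Q)^T = A Q Q^T B^T = A B^T, since Q Q^T = I

print(np.allclose(M, A_rot @ B_rot.T))  # True: the decomposition is not unique
```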


Similar resources

Tensor Decompositions via Two-Mode Higher-Order SVD (HOSVD)

Tensor decompositions have rich applications in statistics and machine learning, and developing efficient, accurate algorithms for the problem has received much attention recently. Here, we present a new method built on Kruskal’s uniqueness theorem to decompose symmetric, nearly orthogonally decomposable tensors. Unlike the classical higher-order singular value decomposition which unfolds a ten...
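For context on the unfolding step mentioned above, here is a minimal sketch of mode-n matricization followed by an SVD of each unfolding, which is the core of the classical HOSVD (a generic NumPy illustration with assumed dimensions, not code from the cited paper):

```python
import numpy as np

def unfold(tensor, mode):
    # Mode-n matricization: bring axis `mode` to the front, flatten the rest.
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

# Hypothetical third-order tensor; the classical HOSVD takes the left singular
# vectors of every mode-n unfolding as the factor matrix for that mode.
rng = np.random.default_rng(0)
T = rng.standard_normal((4, 5, 6))

factors = []
for mode in range(T.ndim):
    U, s, Vt = np.linalg.svd(unfold(T, mode), full_matrices=False)
    factors.append(U)  # mode-n factor matrix of the HOSVD

print([f.shape for f in factors])  # [(4, 4), (5, 5), (6, 6)]
```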


Uniqueness of Tensor Decompositions with Applications to Polynomial Identifiability

We give a robust version of the celebrated result of Kruskal on the uniqueness of tensor decompositions: we prove that given a tensor whose decomposition satisfies a robust form of Kruskal’s rank condition, it is possible to approximately recover the decomposition if the tensor is known up to a sufficiently small (inverse polynomial) error. Kruskal’s theorem has found many applications in provi...


Instability, Isolation, and the Tridecompositional Uniqueness Theorem

The tridecompositional uniqueness theorem of Elby and Bub (1994) shows that a wavefunction in a triple tensor product Hilbert space has at most one decomposition into a sum of product wavefunctions with each set of component wavefunctions linearly independent. I demonstrate that, in many circumstances, the unique component wavefunctions and the coefficients in the expansion are both hopelessly ...


Are Tensor Decomposition Solutions Unique? On the Global Convergence of HOSVD and ParaFac Algorithms

For tensor decompositions such as HOSVD and ParaFac, the objective functions are nonconvex. This implies, theoretically, that there exist a large number of local optima: starting from different starting points, the iteratively improved solution will converge to different local solutions. This non-uniqueness presents a stability and reliability problem for image compression and retrieval. In this pap...


Decompositions of a Higher-Order Tensor in Block Terms - Part II: Definitions and Uniqueness

In this paper we introduce a new class of tensor decompositions. Intuitively, we decompose a given tensor block into blocks of smaller size, where the size is characterized by a set of mode-n ranks. We study different types of such decompositions. For each type we derive conditions under which essential uniqueness is guaranteed. The parallel factor decomposition and Tucker’s decomposition can b...




Publication date: 2014